
    Answering Conjunctive Queries under Updates

    Full text link
    We consider the task of enumerating and counting answers to k-ary conjunctive queries against relational databases that may be updated by inserting or deleting tuples. We exhibit a new notion of q-hierarchical conjunctive queries and show that these can be maintained efficiently in the following sense. During a linear-time preprocessing phase, we can build a data structure that enables constant-delay enumeration of the query results; and when the database is updated, we can update the data structure and restart the enumeration phase within constant time. For the special case of self-join-free conjunctive queries we obtain a dichotomy: if a query is not q-hierarchical, then query enumeration with sublinear^\ast delay and sublinear update time (and arbitrary preprocessing time) is impossible. For answering Boolean conjunctive queries and for the more general problem of counting the number of solutions of k-ary queries we obtain complete dichotomies: if the query's homomorphic core is q-hierarchical, then the size of the query result can be computed in linear time and maintained with constant update time. Otherwise, the size of the query result cannot be maintained with sublinear update time. All our lower bounds rely on the OMv conjecture, a conjecture on the hardness of online matrix-vector multiplication that has recently emerged in the field of fine-grained complexity to characterise the hardness of dynamic problems. The lower bound for the counting problem additionally relies on the orthogonal vectors conjecture, which in turn is implied by the strong exponential time hypothesis. (^\ast) By sublinear we mean O(n^{1-\varepsilon}) for some \varepsilon > 0, where n is the size of the active domain of the current database.
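    The flavour of constant-time count maintenance can be illustrated on the simplest q-hierarchical query, Q(x) = R(x) ∧ S(x). The sketch below (a hypothetical illustration, not the paper's data structure) keeps the answer count up to date in O(1) per single-tuple insert or delete:

```python
class QHierarchicalCounter:
    """Maintain |{x : R(x) and S(x)}| under single-tuple updates.

    Each update touches only two hash sets, so the count is
    maintained in O(1) expected time per update.
    """

    def __init__(self):
        self.R, self.S = set(), set()
        self.count = 0  # current number of query answers

    def update(self, rel, value, insert=True):
        target = self.R if rel == "R" else self.S
        other = self.S if rel == "R" else self.R
        if insert and value not in target:
            target.add(value)
            if value in other:          # a new joint answer appears
                self.count += 1
        elif not insert and value in target:
            target.remove(value)
            if value in other:          # a joint answer disappears
                self.count -= 1
```

For non-q-hierarchical queries no such local bookkeeping suffices, which is what the OMv-based lower bounds formalise.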

    Beyond Worst-Case Analysis for Joins with Minesweeper

    We describe a new algorithm, Minesweeper, that is able to satisfy stronger runtime guarantees than previous join algorithms (colloquially, `beyond worst-case guarantees') for data in indexed search trees. Our first contribution is developing a framework to measure this stronger notion of complexity, which we call certificate complexity, extending notions of Barbay et al. and Demaine et al.; a certificate is a set of propositional formulae that certifies that the output is correct. This notion captures a natural class of join algorithms. In addition, the certificate allows us to define a strictly stronger notion of runtime complexity than traditional worst-case guarantees. Our second contribution is a dichotomy theorem for the certificate-based notion of complexity. Roughly, we show that Minesweeper evaluates β-acyclic queries in time linear in the certificate plus the output size, while for any β-cyclic query there is some instance that takes superlinear time in the certificate (and for which the output is no larger than the certificate size). We also extend our certificate-complexity analysis to queries with bounded treewidth and the triangle query. Comment: This is the full version of our PODS 2014 paper.
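    The gap-skipping intuition behind certificate-sensitive algorithms can be seen on the simplest join of all: intersecting two sorted, indexed lists. The sketch below (an illustration of the general idea, not the Minesweeper algorithm itself) leaps over runs of non-matching entries with binary search, so its running time tracks how the inputs interleave, a rough proxy for certificate size, rather than the total input size:

```python
from bisect import bisect_left

def adaptive_intersect(A, B):
    """Intersect two sorted lists, skipping gaps via binary search.

    When one list contains a long run below the other's next value,
    the whole run is skipped in O(log) time instead of O(run length).
    """
    out, i, j = [], 0, 0
    while i < len(A) and j < len(B):
        if A[i] == B[j]:
            out.append(A[i])
            i += 1
            j += 1
        elif A[i] < B[j]:
            i = bisect_left(A, B[j], i + 1)  # leap past the gap in A
        else:
            j = bisect_left(B, A[i], j + 1)  # leap past the gap in B
    return out
```

On inputs like [1..10^6] ∩ [10^6], this makes O(log n) comparisons where a merge scan would make O(n).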

    ETEA: A Euclidean Minimum Spanning Tree-Based Evolutionary Algorithm for Multiobjective Optimization

    © the Massachusetts Institute of Technology. The Euclidean minimum spanning tree (EMST), widely used in a variety of domains, is a minimum spanning tree of a set of points in space, where the edge weight between each pair of points is their Euclidean distance. Since the generation of an EMST is entirely determined by the Euclidean distance between solutions (points), the properties of EMSTs have a close relation with the distribution and position information of solutions. This paper explores the properties of EMSTs and proposes an EMST-based evolutionary algorithm (ETEA) to solve multiobjective optimization problems (MOPs). Unlike most EMO algorithms that focus on the Pareto dominance relation, the proposed algorithm mainly considers distance-based measures to evaluate and compare individuals during the evolutionary search. Specifically, four strategies are introduced in ETEA: 1) an EMST-based crowding distance (ETCD) is presented to estimate the density of individuals in the population; 2) a distance comparison approach incorporating ETCD is used to assign the fitness value for individuals; 3) a fitness adjustment technique is designed to avoid partial overcrowding in environmental selection; 4) three diversity indicators (the minimum edge, degree, and ETCD) with regard to EMSTs are applied to determine the survival of individuals in archive truncation. From a series of extensive experiments on 32 test instances with different characteristics, ETEA is found to be competitive against five state-of-the-art algorithms and its predecessor in providing a good balance among convergence, uniformity, and spread. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) of the United Kingdom under Grant EP/K001310/1 and the National Natural Science Foundation of China under Grant 61070088.
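    The EMST that the four strategies build on can be computed with Prim's algorithm on the complete Euclidean graph. The sketch below is only this tree construction (not ETEA's crowding-distance or selection machinery), and the quadratic version is adequate for typical EMO population sizes:

```python
import math

def emst_edges(points):
    """Prim's algorithm on the complete Euclidean graph.

    Returns the n-1 edges (i, j, dist) of the Euclidean minimum
    spanning tree of `points` in O(n^2) time.
    """
    n = len(points)
    in_tree = [False] * n
    # best[v] = (cheapest distance from v to the tree, tree endpoint)
    best = [(math.inf, -1)] * n
    best[0] = (0.0, 0)
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]),
                key=lambda i: best[i][0])
        in_tree[u] = True
        if best[u][1] != u:                  # skip the artificial root edge
            edges.append((best[u][1], u, best[u][0]))
        for v in range(n):                   # relax edges out of u
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < best[v][0]:
                    best[v] = (d, u)
    return edges
```

A density measure like ETCD can then be derived from the edges incident to each point, e.g. from their lengths and the point's degree in the tree.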

    Probing photo-ionization: simulations of positive streamers in varying N2:O2 mixtures

    Photo-ionization is the accepted mechanism for the propagation of positive streamers in air, though its parameters are not very well known; the efficiency of this mechanism largely depends on the presence of both nitrogen and oxygen. But experiments show that streamer propagation is remarkably robust against changes of the gas composition; even in pure nitrogen with impurity levels below 1 ppm, streamers propagate with essentially the same velocity as in air, but their minimal diameter is smaller and they branch more frequently. Additionally, they move more in a zigzag fashion and sometimes exhibit a feathery structure. In our simulations, we test the relative importance of photo-ionization and of the background ionization from pulsed repetitive discharges, in air as well as in nitrogen with 1 ppm O2. We also test reasonable parameter changes of the photo-ionization model. We find that photo-ionization dominates streamer propagation in air for repetition frequencies of at least 1 kHz, while in nitrogen with 1 ppm O2 the effect of the repetition frequency has to be included above 1 Hz. Finally, we explain the feather-like structures around streamer channels that are observed in experiments in nitrogen with high purity, but not in air. Comment: 12 figures.

    Probing photo-ionization: Experiments on positive streamers in pure gases and mixtures

    Positive streamers are thought to propagate by photo-ionization, whose parameters depend on the nitrogen:oxygen ratio. Therefore we study streamers in nitrogen with 20%, 0.2% and 0.01% oxygen and in pure nitrogen, as well as in pure oxygen and argon. Our new experimental set-up guarantees contamination of the pure gases to be well below 1 ppm. Streamers in oxygen are difficult to measure, as they emit considerably less light in the sensitivity range of our fast ICCD camera than the other gases. Streamers in pure nitrogen and in all nitrogen/oxygen mixtures look generally similar, but become somewhat thinner and branch more with decreasing oxygen content. In pure nitrogen the streamers can branch so much that they resemble feathers. This feature is even more pronounced in pure argon, with approximately 10^2 hair tips/cm^3 in the feathers at 200 mbar; this density could be interpreted as the free electron density creating avalanches towards the streamer stem. It is remarkable that the streamer velocity is essentially the same for similar voltage and pressure in all nitrogen/oxygen mixtures as well as in pure nitrogen, while the oxygen concentration and therefore the photo-ionization lengths vary by more than five orders of magnitude. Streamers in argon have essentially the same velocity as well. The physical similarity of streamers at different pressures is confirmed in all gases; the minimal diameters are smaller than in earlier measurements. Comment: 28 pages, 14 figures. Major differences with v1: appendix and spectra removed; subsection regarding effects of repetition frequency added; many smaller changes.

    Probing background ionization: Positive streamers with varying pulse repetition rate and with a radioactive admixture

    Positive streamers need a source of free electrons ahead of them to propagate. A streamer can supply these electrons by itself through photo-ionization, or the electrons can be present due to external background ionization. Here we investigate the effects of background ionization on streamer propagation and morphology by changing the gas composition and the repetition rate of the voltage pulses, and by adding a small amount of radioactive krypton-85. We find that the general morphology of a positive streamer discharge in high-purity nitrogen depends on background ionization: at lower background ionization levels the streamers branch more and have a more feather-like appearance. This is observed both when varying the repetition rate and when adding krypton-85, though side branches are longer with the radioactive admixture. But velocities and minimal diameters of streamers are virtually independent of the background ionization level. In air, the inception cloud breaks up into streamers at a smaller radius when the repetition rate, and therefore the background ionization level, is higher. When measuring the effects of the pulse repetition rate and of the radioactive admixture on the discharge morphology, we found that our estimates of background ionization levels are consistent with these observations; this gives confidence in the estimates. Streamer channels generally do not follow the paths of previous discharge channels for repetition rates of up to 10 Hz. We estimate the effect of recombination and diffusion of ions and free electrons from the previous discharge and conclude that the old trail has largely disappeared at the moment of the next voltage pulse; therefore the next streamers indeed cannot follow the old trail. Comment: 30 pages, 13 figures.

    Surfactant status and respiratory outcome in premature infants receiving late surfactant treatment.

    BACKGROUND: Many premature infants with respiratory failure are deficient in surfactant, but the relationship to the occurrence of bronchopulmonary dysplasia (BPD) is uncertain. METHODS: Tracheal aspirates were collected from 209 treated and control infants enrolled at 7-14 days in the Trial of Late Surfactant. The contents of phospholipid, surfactant protein B, and total protein were determined in large aggregate (active) surfactant. RESULTS: At 24 h, surfactant treatment transiently increased surfactant protein B content (70%, p < 0.01), but did not affect recovered airway surfactant or total protein/phospholipid. The level of recovered surfactant during dosing was directly associated with the content of surfactant protein B (r = 0.50, p < 0.00001) and inversely related to total protein (r = 0.39, p < 0.0001). For all infants, occurrence of BPD was associated with lower levels of recovered large aggregate surfactant, higher protein content, and lower SP-B levels. Tracheal aspirates with lower amounts of recovered surfactant had an increased proportion of small vesicle (inactive) surfactant. CONCLUSIONS: We conclude that many intubated premature infants are deficient in active surfactant, in part due to increased intra-alveolar metabolism, low SP-B content, and protein inhibition, and that the severity of this deficit is predictive of BPD. Late surfactant treatment at the frequency used did not provide a sustained increase in airway surfactant.

    FAQ


    Qualitative Evaluation of Common Quantitative Metrics for Clinical Acceptance of Automatic Segmentation: A Case Study on Heart Contouring from CT Images by Deep Learning Algorithms

    Organs-at-risk contouring is time consuming and labour intensive. Automation by deep learning algorithms would decrease the workload of radiotherapists and technicians considerably. However, the variety of metrics used for the evaluation of deep learning algorithms makes the results of many papers difficult to interpret and compare. In this paper, a qualitative evaluation is done on five established metrics to assess whether their values correlate with clinical usability. A total of 377 CT volumes with heart delineations were randomly selected for training and evaluation. A deep learning algorithm was used to predict the contours of the heart. A total of 101 CT slices from the validation set with the predicted contours were shown to three experienced radiologists. They independently assessed for each slice whether they would accept or adjust the prediction, and whether it contained (small) mistakes. For each slice, the scores of this qualitative evaluation were then compared with the Sørensen-Dice coefficient (DC), the Hausdorff distance (HD), pixel-wise accuracy, sensitivity, and precision. The statistical analysis of the qualitative evaluation and the metrics showed a significant correlation. Of the slices with a DC over 0.96 (N = 20) or a 95% HD under 5 voxels (N = 25), none were rejected by the readers. Contours with lower DC or higher HD were seen among both rejected and accepted contours. The qualitative evaluation shows that it is difficult to use common quantitative metrics as indicators of clinical usability. We might need to change the reporting of quantitative metrics to better reflect clinical acceptance.
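    The two metrics behind the acceptance thresholds above can be sketched for binary masks as follows. This is a simplified illustration: the percentile-Hausdorff function here compares raw foreground pixels with brute-force pairwise distances, whereas production pipelines typically use boundary points and distance transforms:

```python
import numpy as np

def dice_coefficient(a, b):
    """Sørensen-Dice coefficient of two boolean masks: 2|A∩B| / (|A|+|B|)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff_percentile(a, b, q=95):
    """q-th percentile of symmetric nearest-point distances between the
    foreground pixels of two masks (a simplified 95% Hausdorff distance)."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    # pairwise distances between every foreground pixel of a and of b
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    return max(np.percentile(d.min(axis=1), q),   # a -> b direction
               np.percentile(d.min(axis=0), q))   # b -> a direction
```

Identical masks give DC = 1.0 and HD = 0, matching the "always accepted" end of the thresholds reported above.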